Web Survey Bibliography
Title Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations
Year 2017
Access date 16.09.2017
Abstract According to social interface theory (Nass et al. 1996; Nass et al. 1997; Fogg & Nass 1996), humanizing information conveyed by a computer triggers reactions typical of human-to-human interaction. Building on these findings, several methodological studies have investigated whether the regularities described by psychologists can also be observed in internet surveys, for example in the form of interviewer effects (e.g., Tourangeau et al. 2003; Couper et al. 2003; Fuchs 2009). The results of this research, however, appear inconsistent.
In this presentation, we report selected results from an experiment conducted in November and December 2016 among university students (N=900) as part of a research project funded by the Polish National Science Centre. The project aims to estimate the influence of humanizing cues on the quality of data obtained in internet surveys. The presentation shows findings concerning the impact of the interviewer's gender on the data. The experiment followed a multifactorial design with equal, completely randomized groups. The form of interviewer imitation/presence was the major independent variable (factor A), with four scenarios: (1) CAWI/text (all stimuli presented as text); (2) CAWI/photo (stimuli presented as text plus an interviewer photo); (3) CAWI/movie (all stimuli presented as video of real interviewers, with the answer options additionally shown as text); and (4) CAPI. The other independent variables were: interviewer gender (factor B, nested within factor A), with two values (male/female) plus an extra version giving no information about the interviewer's gender ("we"); interviewer (factor C, a random factor nested within factors A and B), for which five male and five female interviewers were engaged; and respondent gender (factor D, a constant factor), with two values (male/female).
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - European Survey Research Association conference 2017, ESRA, Lisbon (26)
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- A test of sample matching using a pseudo-web sample; 2017; Chatrchi, G.; Gambino, J.
- A Partially Successful Attempt to Integrate a Web-Recruited Cohort into an Address-Based Sample; 2017; Kott, P. S.; Farrelly, M.; Kamyab, K.
- Nonprobability sampling as model construction; 2017; Mercer, A. W.